Inertial navigation systems for mobile robots
A low-cost solid-state inertial navigation system
(INS) for mobile robotics applications is described. Error models
for the inertial sensors are generated and included in an Extended
Kalman Filter (EKF) for estimating the position and orientation
of a moving robot vehicle. Two different solid-state gyroscopes
have been evaluated for estimating the orientation of the robot.
Performance of the gyroscopes with error models is compared to
the performance when the error models are excluded from the
system. The results demonstrate that without error compensation,
the error in orientation is between 5-15°/min but can be improved
at least by a factor of 5 if an adequate error model is supplied.
Similar error models have been developed for each axis of a solid-state triaxial accelerometer and for a conducting-bubble tilt sensor, which may also be used as a low-cost accelerometer. Linear
position estimation with information from accelerometers and tilt sensors is more susceptible to errors due to the double integration
process involved in estimating position. With the system described
here, the position drift rate is 1-8 cm/s, depending on the frequency
of acceleration changes. An integrated inertial platform
consisting of three gyroscopes, a triaxial accelerometer and two
tilt sensors is described. Results from tests of this platform on a large outdoor mobile robot system are described and compared to
the results obtained from the robot's own radar-based guidance
system. Like all inertial systems, the platform requires additional
information from some absolute position-sensing mechanism to
overcome long-term drift. However, the results show that with
careful and detailed modeling of error sources, low-cost inertial
sensing systems can provide valuable orientation and position
information, particularly for outdoor mobile robot applications.
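The core idea of this abstract — augmenting the estimator state with a sensor error term so the filter learns the error model online — can be illustrated with a minimal two-state Kalman filter (heading plus gyro bias). This is a sketch under assumed noise values and a simulated constant bias, not the paper's actual error models:

```python
import numpy as np

# Minimal sketch: a 2-state Kalman filter estimating heading theta and a
# slowly varying gyro bias b. The gyro measures omega_true + b + noise;
# sparse absolute heading fixes (e.g. a compass) bound the long-term drift,
# mirroring the abstract's point that inertial systems need an absolute
# position-sensing mechanism. All numeric values are illustrative.
dt = 0.01
F = np.array([[1.0, -dt],    # theta_{k+1} = theta_k + dt*(omega_meas - b_k)
              [0.0, 1.0]])   # bias modeled as a random walk
Q = np.diag([1e-6, 1e-8])    # assumed process noise
H = np.array([[1.0, 0.0]])   # an absolute fix observes theta only
R = np.array([[1e-3]])

x = np.zeros((2, 1))         # state: [theta; bias]
P = np.eye(2)

def predict(x, P, omega_meas):
    x = np.array([[x[0, 0] + dt * (omega_meas - x[1, 0])],
                  [x[1, 0]]])
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, theta_fix):
    y = np.array([[theta_fix - x[0, 0]]])   # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate a stationary robot whose gyro has a constant 0.05 rad/s bias.
rng = np.random.default_rng(0)
true_bias = 0.05
for k in range(2000):
    omega_meas = true_bias + rng.normal(0, 0.01)
    x, P = predict(x, P, omega_meas)
    if k % 100 == 0:                        # occasional absolute fixes
        x, P = update(x, P, rng.normal(0, 0.01))

print(round(x[1, 0], 3))  # estimated gyro bias (true value 0.05)
```

Without the bias state, the heading would drift linearly between fixes; with it, the filter attributes the accumulated innovation to the bias and compensates, which is the factor-of-5 improvement mechanism the abstract describes.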
Evaluation of solid-state gyroscope for robotics applications
The evaluation of a low-cost solid-state gyroscope
for robotics applications is described. An error model for the
sensor is generated and included in a Kalman filter for estimating
the orientation of a moving robot vehicle. Orientation eshation
with the error model is compared to the performance when the
error model is excluded from the system. The results demonstrate
that without error compensation, the error in localization is
between 5-15°/min but can be improved at least by a factor
of 5 if an adequate error model is supplied. Like all inertial
systems, the platform requires additional information from some
absolute position-sensing mechanism to overcome long-term drift.
However, the results show that with careful and detailed modeling
of error sources, inertial sensors can provide valuable orientation
information for mobile robot applications.
Load Balancing for Mobility-on-Demand Systems
In this paper we develop methods for maximizing the throughput of a mobility-on-demand urban transportation system. We consider a finite group of shared vehicles, located at a set of stations. Users arrive at the stations, pick up vehicles, and drive (or are driven) to their destination station, where they drop off the vehicle. When some origins and destinations are more popular than others, the system will inevitably become unbalanced: vehicles will build up at some stations and become depleted at others. We propose a robotic solution to this rebalancing problem in which empty robotic vehicles autonomously drive between stations. We develop a rebalancing policy that minimizes the number of vehicles performing rebalancing trips. To do this, we utilize a fluid model for the customers and vehicles in the system. The model takes the form of a set of nonlinear time-delay differential equations. We then show that the optimal rebalancing policy can be found as the solution to a linear program. By analyzing the dynamical system model, we show that every station reaches an equilibrium in which there are excess vehicles and no waiting customers. We use this solution to develop a real-time rebalancing policy which can operate in highly variable environments. We verify the policy's performance in a simulated mobility-on-demand environment with stochastic features found in real-world urban transportation networks.
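The "rebalancing as a linear program" step can be sketched on a toy instance: given hypothetical per-station vehicle surpluses (positive = excess, negative = deficit), choose station-to-station rebalancing flows that restore balance at minimum total cost. The station count, surpluses, and unit costs below are illustrative assumptions, not the paper's fluid model:

```python
import numpy as np
from scipy.optimize import linprog

# Toy rebalancing LP: pick flows f[(i, j)] of empty vehicles between
# 3 stations so each station's net outflow equals its surplus, minimizing
# total rebalanced vehicles (unit costs; real costs would be travel times).
surplus = np.array([4.0, -1.0, -3.0])   # hypothetical; must sum to zero
n = len(surplus)
pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
cost = np.ones(len(pairs))

# Flow conservation: sum_j f[i][j] - sum_j f[j][i] = surplus[i]
A_eq = np.zeros((n, len(pairs)))
for k, (i, j) in enumerate(pairs):
    A_eq[i, k] += 1.0
    A_eq[j, k] -= 1.0

res = linprog(cost, A_eq=A_eq, b_eq=surplus, bounds=(0, None))
flows = {p: f for p, f in zip(pairs, res.x) if f > 1e-9}
print(res.fun, flows)  # minimum total flow equals the total surplus, 4.0
```

With unit costs the optimum simply ships each surplus vehicle once (total flow 4.0); with travel-time costs the same LP trades off which deficit stations each surplus station should serve.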
A sensor fusion layer to cope with reduced visibility in SLAM
Mapping and navigating with mobile robots in scenarios with reduced visibility, e.g. due to smoke, dust, or fog, remains a major challenge. Despite the tremendous advances in Simultaneous Localization and Mapping (SLAM) techniques over the past decade, most current algorithms fail in those environments because they usually rely on optical sensors providing dense range data, e.g. laser range finders, stereo vision, LIDARs, RGB-D, etc., whose measurement process is highly disturbed by particles of smoke, dust, or steam. This article addresses the problem of performing SLAM under reduced visibility conditions by proposing a sensor fusion layer which exploits the complementary characteristics of a laser range finder (LRF) and an array of sonars. This sensor fusion layer is ultimately used with a state-of-the-art SLAM technique to be resilient in scenarios where visibility cannot be assumed at all times. Special attention is given to mapping using commercial off-the-shelf (COTS) sensors, namely arrays of sonars which, being usually available in robotic platforms, raise technical issues that were investigated in the course of this work. Two sensor fusion methods, a heuristic method and a fuzzy logic-based method, are presented and discussed, corresponding to different stages of the research work conducted. The experimental validation of both methods with two different mobile robot platforms in smoky indoor scenarios showed that they provide a robust solution, using only COTS sensors, for adequately coping with reduced visibility in the SLAM process, thus significantly decreasing its impact in the mapping and localization results obtained.
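The heuristic branch of such a fusion layer can be sketched as a simple per-beam rule: smoke tends to produce spuriously short LRF returns, while sonar (being acoustic) is largely unaffected, so a large disagreement in that direction suggests falling back to the sonar. The function and threshold below are a hypothetical illustration in the spirit of the abstract, not the article's actual method:

```python
# Hypothetical per-beam fusion rule for a co-located LRF beam and sonar cone.
# Assumption: smoke/dust particles cause the LRF to return much shorter
# ranges than the true obstacle distance, while the sonar stays reliable.
def fuse_range(lrf_m, sonar_m, ratio_threshold=0.5):
    if lrf_m is None:                    # no LRF return at all
        return sonar_m
    if sonar_m is None:                  # sonar cone saw nothing
        return lrf_m
    if lrf_m < ratio_threshold * sonar_m:
        return sonar_m                   # likely particle-induced short return
    return lrf_m                         # clear air: prefer the accurate LRF

print(fuse_range(0.4, 3.0))  # smoke-like disagreement -> trust sonar: 3.0
print(fuse_range(2.9, 3.0))  # agreement -> trust LRF: 2.9
```

The fuzzy-logic variant mentioned in the abstract would replace the hard threshold with membership functions over the LRF/sonar discrepancy, yielding a weighted blend instead of a binary choice.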
Robot Mapping and Localisation for Feature Sparse Water Pipes Using Voids as Landmarks
Robotic systems for water pipe inspection do not generally include navigation components for mapping the pipe network and locating damage. Such navigation systems would be highly advantageous for water companies because they would allow maintenance to be targeted more effectively and costs to be reduced. In water pipes, a major challenge for robot navigation is feature sparsity. To address this problem, a novel approach for robot navigation in water pipes is developed here, which uses a new type of landmark feature: voids outside the pipe wall, sensed by ultrasonic scanning. The method was successfully demonstrated in a laboratory environment and showed for the first time the potential of using voids for robot navigation in water pipes.
SLAM algorithm applied to robotics assistance for navigation in unknown environments
Background: The combination of robotic tools with assistive technology defines a scarcely explored area of applications and advantages for disabled or elderly people in their daily tasks. Autonomous navigation of a motorized wheelchair inside an environment, behaviour-based control of orthopaedic arms, or learning the user's preferences from a friendly interface are some examples of this new field. In this paper, a Simultaneous Localization and Mapping (SLAM) algorithm is implemented to allow environmental learning by a mobile robot while its navigation is governed by electromyographic signals. The entire system is part autonomous and part user-decision dependent (semi-autonomous). The environmental learning executed by the SLAM algorithm and the low-level behaviour-based reactions of the mobile robot are autonomous robotic tasks, whereas the mobile robot's navigation inside the environment is commanded by a Muscle-Computer Interface (MCI).
Methods: A sequential Extended Kalman Filter (EKF) feature-based SLAM algorithm is implemented. The features correspond to lines and corners (concave and convex) of the environment. From the SLAM architecture, a global metric map of the environment is derived. The electromyographic signals that command the robot's movements can be adapted to the patient's disabilities. For mobile robot navigation purposes, five commands were obtained from the MCI: turn left, turn right, stop, start, and exit. A kinematic controller for the mobile robot was implemented, along with a low-level behaviour strategy to avoid collisions with the environment and moving agents.
Results: The entire system was tested with seven volunteers: three elderly subjects, two below-elbow amputees, and two young, normally limbed subjects. The experiments were performed within a closed, low-dynamic environment. Subjects took an average of 35 minutes to navigate the environment and to learn how to use the MCI. The SLAM results showed a consistent reconstruction of the environment. The obtained map was stored inside the Muscle-Computer Interface.
Conclusions: The integration of a highly demanding processing algorithm (SLAM) with an MCI, and the real-time communication between both, proved consistent and successful. The metric map generated by the mobile robot would allow future autonomous navigation without direct control by the user, whose function could be reduced to choosing robot destinations. Also, the mobile robot shares the kinematic model of a motorized wheelchair, an advantage that can be exploited for autonomous wheelchair navigation.
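The command layer described in the Methods section can be sketched as a mapping from the five discrete MCI commands to unicycle velocity set-points, integrated by a simple kinematic model. The velocity magnitudes and time step are assumptions for illustration; the paper's actual controller is not reproduced here:

```python
import math

# Hypothetical mapping from discrete MCI commands to (v, w) set-points
# for a unicycle/wheelchair kinematic model. "exit" ends the session and
# is handled outside this mapping. Magnitudes are assumed, not the paper's.
COMMANDS = {
    "start": (0.3, 0.0),    # forward speed m/s, angular rate rad/s
    "left":  (0.0, 0.5),
    "right": (0.0, -0.5),
    "stop":  (0.0, 0.0),
}

def step(pose, command, dt=0.1):
    """One Euler step of the unicycle model x' = v cos(th), y' = v sin(th)."""
    v, w = COMMANDS[command]
    x, y, th = pose
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + w * dt)

pose = (0.0, 0.0, 0.0)
for cmd in ["start"] * 10 + ["left"] * 5 + ["stop"]:
    pose = step(pose, cmd)
print(tuple(round(p, 2) for p in pose))  # ~0.3 m forward, then ~0.25 rad turn
```

In the actual system these set-points would feed the kinematic controller, with the low-level behaviour layer able to override them to avoid collisions.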
No. 1. Request of the parish council of the Novo-Slovianskyi Cathedral to Father O. Yareshchenko to hold an episcopal service on Pentecost ("Zeleni Sviata"), dated 18 May 1923
Published by D. Boiko, I. Bukharieva, S. Kokin, P. Kulakovskyi
- …